Do we need scan-matching in radar odometry?
There is a current increase in the development of "4D" Doppler-capable radar
and lidar range sensors that produce 3D point clouds where all points also have
information about the radial velocity relative to the sensor. 4D radars in
particular are interesting for object perception and navigation in
low-visibility conditions (dust, smoke) where lidars and cameras typically
fail. With the advent of high-resolution Doppler-capable radars comes the
possibility of estimating odometry from single point clouds, foregoing the need
for scan registration which is error-prone in feature-sparse field
environments. We compare several odometry estimation methods, from direct
integration of Doppler/IMU data and Kalman filter sensor fusion to 3D
scan-to-scan and scan-to-map registration, on three datasets with data from two
recent 4D radars and two IMUs. Surprisingly, our results show that odometry
from Doppler and IMU data alone gives similar or better results than 3D point
cloud registration. In our experiments, the average position error can be as
low as 0.3% over 1.8 and 4.5 km trajectories. This allows accurate estimation of
6DOF ego-motion over long distances, even in feature-sparse mine environments.
These results are useful not least for applications of navigation with
resource-constrained robot platforms in feature-sparse and low-visibility
conditions such as mining, construction, and search & rescue operations.
Comment: Preprint. Submitted to ICRA 2024. 7 pages, 11 figures.
Improving perception and locomotion capabilities of mobile robots in urban search and rescue missions
Deployment of mobile robots in search and rescue missions is a way to make the job of human rescuers safer and more efficient. Such missions, however, require robots to be resilient to the harsh conditions of natural disasters or human-inflicted accidents. They have to operate on unstable rough terrain, in confined spaces, or in sensory-deprived environments filled with smoke or dust. Localization, a common task in mobile robotics that involves determining position and orientation with respect to a given coordinate frame, faces these conditions as well. In this thesis, we describe the development of a localization system for a tracked mobile robot intended for search and rescue missions. We first present a proprioceptive 6-degrees-of-freedom localization system, which arose from an experimental comparison of several possible sensor fusion architectures. The system was then modified to incorporate exteroceptive velocity measurements, which significantly improve accuracy by reducing localization drift. Special attention was given to potential sensor outages and failures, to the track slippage that inevitably occurs with this type of robot, to the computational demands of the system, and to the different sampling rates at which sensory data arrive. Additionally, we addressed the problem of kinematic models for tracked odometry on rough terrain containing vertical obstacles. Thanks to the research projects the robot was designed for, we had access to training facilities used by the fire brigades of Italy, Germany, and the Netherlands. The accuracy and robustness of the proposed localization systems were therefore tested in conditions closely resembling those seen in earthquake aftermaths and industrial accidents. The datasets of sensory measurements and reference positions that we created to test localization accuracy are publicly available, and we consider them one of the contributions of this thesis.
This thesis takes the form of a compilation of three published journal papers and one paper that was under review at the time of submission.
Doppler-only Single-scan 3D Vehicle Odometry
We present a novel 3D odometry method that recovers the full motion of a
vehicle only from a Doppler-capable range sensor. It leverages the radial
velocities measured from the scene, estimating the sensor's velocity from a
single scan. The vehicle's 3D motion, defined by its linear and angular
velocities, is calculated taking into consideration its kinematic model, which
provides a constraint between the velocity measured at the sensor frame and
that of the vehicle frame.
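The core estimation step described above, recovering the sensor's velocity from the radial velocities of a single scan, can be sketched as a linear least-squares problem: for a static scene, each point's Doppler reading equals the negative projection of the sensor velocity onto that point's bearing. This is a minimal sketch under that assumption, not the paper's implementation.

```python
import numpy as np

def estimate_ego_velocity(points, radial_velocities):
    """points: (N,3) point positions in the sensor frame;
    radial_velocities: (N,) measured Doppler speeds.
    Returns the estimated sensor linear velocity (3,)."""
    bearings = points / np.linalg.norm(points, axis=1, keepdims=True)
    # model: r_i = -u_i . v  =>  solve bearings @ v = -radial_velocities
    v, *_ = np.linalg.lstsq(bearings, -radial_velocities, rcond=None)
    return v

# synthetic static scene observed by a sensor moving with true_v
rng = np.random.default_rng(0)
true_v = np.array([1.0, 0.5, -0.2])
pts = rng.normal(size=(200, 3)) * 10.0
u = pts / np.linalg.norm(pts, axis=1, keepdims=True)
r = -u @ true_v
est = estimate_ego_velocity(pts, r)
# est ≈ true_v
```

In practice, moving objects violate the static-scene assumption, so a robust variant (e.g. RANSAC over the same linear model) would typically be used.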
Our experiments demonstrate the viability of this single-sensor method
compared to mounting an additional IMU. The method provides the translation of
the sensor, which cannot be reliably determined from an IMU, as well as its
rotation. Its short-term accuracy and fast operation (~5 ms) make it a suitable
candidate to supply the initialization to more complex localization algorithms
or mapping pipelines. Not only does it reduce the error of the mapper, but it
does so at a level of accuracy comparable to an IMU, all without the need
to mount and calibrate an extra sensor on the vehicle.
Comment: This work has been submitted to the IEEE for possible publication.
Copyright may be transferred without notice, after which this version may no
longer be accessible.
Vibration Suppression in Inertial Sensors Signals
Department of Measurement
Robust Data Fusion of Multi-modal Sensory Information for Mobile Robots
Urban Search and Rescue missions for mobile robots require reliable state estimation systems resilient to the conditions of a dynamically changing environment. We design and evaluate a data fusion system for localization of a mobile skid-steer robot intended for USAR missions. We exploit a rich sensor suite including both proprioceptive (inertial measurement unit and track odometry) and exteroceptive sensors (omnidirectional camera and rotating laser rangefinder). To cope with the specificities of each sensing modality (such as significantly differing sampling frequencies), we introduce a novel fusion scheme based on an Extended Kalman filter for 6DOF orientation and position estimation. We demonstrate the performance in field tests of more than 4.4 km driven under standard USAR conditions. Parts of our datasets include ground-truth positioning: indoor with a Vicon motion capture system and outdoor with a Leica theodolite tracker. The overall median accuracy of localization, achieved by combining all four modalities, was 1.2% and 1.4% of the total distance traveled for indoor and outdoor environments, respectively. To identify the true limits of the proposed data fusion, we propose and employ a novel experimental evaluation procedure based on failure-case scenarios. In this way we address common issues such as slippage, reduced camera field of view, and limited laser rangefinder range, together with moving obstacles spoiling the metric map. We believe such characterization of the failure cases is a first step towards identifying the behavior of state estimation under these conditions. We release all our datasets to the robotics community for possible benchmarking.
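The multi-rate fusion problem mentioned above (sensors with significantly differing sampling frequencies) is commonly handled by predicting the filter state forward to each measurement's timestamp before applying the update. The following toy sketch illustrates that idea on a 1D constant-velocity state with a fast velocity sensor and a slow position sensor; it is an illustrative assumption, not the paper's 6DOF filter.

```python
import numpy as np

class AsyncKF:
    """Kalman filter over [position, velocity] that accepts
    asynchronous measurements at arbitrary timestamps."""
    def __init__(self):
        self.x = np.zeros(2)
        self.P = np.eye(2)
        self.t = 0.0

    def predict(self, t):
        dt = t - self.t
        F = np.array([[1.0, dt], [0.0, 1.0]])
        Q = np.diag([0.01, 0.01]) * max(dt, 1e-9)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q
        self.t = t

    def update(self, t, z, H, R):
        self.predict(t)                 # advance state to measurement time
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(2) - K @ H) @ self.P

kf = AsyncKF()
H_vel = np.array([[0.0, 1.0]])          # 100 Hz velocity (odometry-like)
H_pos = np.array([[1.0, 0.0]])          # 1 Hz position (exteroceptive fix)
for k in range(1, 501):                 # 5 s of data, true velocity 1 m/s
    t = k * 0.01
    kf.update(t, np.array([1.0]), H_vel, np.array([[0.04]]))
    if k % 100 == 0:
        kf.update(t, np.array([t]), H_pos, np.array([[0.09]]))
# kf.x ≈ [5, 1]
```

The same predict-to-timestamp pattern generalizes to the nonlinear 6DOF case by replacing F and H with the Jacobians of the motion and measurement models.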
Doppler-only Single-scan 3D Vehicle Odometry
Dataset provided with the article of the same name. Created to test the performance of 3D Doppler-capable radar odometry in outdoor scenarios. Sensors mounted on the vehicle include a 3D Doppler-capable radar, a 3D lidar, and an IMU.